First Hitting Time Analysis of the Independence Metropolis Sampler

Authors

  • Romeo Maciuca
  • Song-Chun Zhu
Abstract

In this paper, we study a special case of the Metropolis algorithm, the Independence Metropolis Sampler (IMS), in the finite state space case. The IMS is often used in designing components of more complex Markov Chain Monte Carlo algorithms. We present new results on the first hitting time of individual states for the IMS, expressed mostly in terms of the eigenvalues of the transition kernel. We derive a simple closed-form formula for the mean first hitting time and establish tight lower and upper bounds on it, with the upper bound being the product of two factors: a “local” factor corresponding to the target state and a “global” factor, common to all states, expressed in terms of the total variation distance between the target and proposal distributions. We also briefly discuss properties of the distribution of the first hitting time for the IMS and analyze its variance. We conclude by showing how some non-independence Metropolis–Hastings algorithms can outperform the IMS and by deriving general lower and upper bounds on the mean first hitting times of a Metropolis–Hastings algorithm.
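The IMS described in the abstract proposes each move from a fixed distribution q, independently of the current state, and accepts with the usual Metropolis–Hastings ratio, which for an independence proposal reduces to min(1, w(y)/w(x)) with importance weights w = π/q. A minimal sketch of the sampler on a finite state space, together with a Monte Carlo estimate of the mean first hitting time of a chosen state (the quantity the paper analyzes), might look as follows; the state space, π, and q below are illustrative assumptions, not taken from the paper:

```python
import random

def ims_first_hitting_time(pi, q, start, target, rng, max_steps=100_000):
    """Run the Independence Metropolis Sampler on states 0..n-1 and
    return the first step t >= 1 at which `target` is occupied.

    pi : target probabilities, q : proposal probabilities (both sum to 1).
    """
    x = start
    for t in range(1, max_steps + 1):
        # Propose y from q, independently of the current state x.
        y = rng.choices(range(len(q)), weights=q)[0]
        # Independence-proposal acceptance: min(1, w(y)/w(x)), w = pi/q.
        w_x = pi[x] / q[x]
        w_y = pi[y] / q[y]
        if rng.random() < min(1.0, w_y / w_x):
            x = y
        if x == target:
            return t
    return max_steps  # hitting time truncated at max_steps

# Toy 3-state example with a deliberately mismatched proposal.
pi = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
rng = random.Random(0)
hits = [ims_first_hitting_time(pi, q, start=0, target=2, rng=rng)
        for _ in range(2000)]
mean_hit = sum(hits) / len(hits)
print(mean_hit)  # empirical mean first hitting time of state 2
```

Averaging many independent runs as above gives an empirical check against the paper's closed-form expression; the total variation distance between pi and q controls the "global" factor in the paper's upper bound.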


Similar articles

How Do Heuristics Expedite Markov Chain Search? Hitting-time Analysis of the Independence Metropolis Sampler

Solving vision problems often entails searching a solution space for optimal state(s) that have maximum Bayesian posterior probability or minimum energy. When the volume of the space is huge, exhaustive search becomes infeasible. Generic stochastic search (e.g. Markov chain Monte Carlo) can be even worse than exhaustive search, as it may visit a state repeatedly. To expedite the Markov chain se...


Optimal scaling of the independence sampler: Theory and Practice

The independence sampler is one of the most commonly used MCMC algorithms, usually as a component of a Metropolis-within-Gibbs algorithm. The common focus for the independence sampler is on choosing a proposal distribution that yields as high an acceptance rate as possible. In this paper we have a somewhat different focus, concentrating on the use of the independence sampler for updating augment...


On the use of auxiliary variables in Markov chain Monte Carlo sampling

We study the slice sampler, a method of constructing a reversible Markov chain with a specified invariant distribution. Given an independence Metropolis-Hastings algorithm, it is always possible to construct a slice sampler that dominates it in the Peskun sense. This means that the resulting Markov chain produces estimates with a smaller asymptotic variance. Furthermore, the slice sampler has a sm...


On adaptive Metropolis-Hastings methods

This paper presents a method for adaptation in Metropolis-Hastings algorithms. A product of a proposal density and K copies of the target density is used to define a joint density, which is sampled by a Gibbs sampler including a Metropolis step. This provides a framework for adaptation, since the current values of all K copies of the target distribution can be used in the proposal distribution. Th...


MCMC Methods for Diffusion Bridges

We present and study a Langevin MCMC approach for sampling nonlinear diffusion bridges. The method is based on recent theory concerning stochastic partial differential equations (SPDEs) reversible with respect to the target bridge, derived by applying the Langevin idea on the bridge pathspace. In the process, a Random-Walk Metropolis algorithm and an Independence Sampler are also obtained. The ...



Publication date: 2003